The Gaussian Many-Help-One Distributed Source Coding Problem

Authors

Abstract


Related articles

Distributed Source Coding of Correlated Gaussian Sources

We consider the distributed source coding system of L correlated Gaussian sources Y_l, l = 1, 2, ..., L, which are noisy observations of correlated Gaussian remote sources X_k, k = 1, 2, ..., K. We assume that Y^L = (Y_1, Y_2, ..., Y_L) is an observation of the source vector X = (X_1, X_2, ..., X_K), having the form Y^L = AX + N, where A is an L×K matrix and N = (N_1, N_2, ..., N_L) is a vector ...
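As a rough illustration of the observation model in this abstract, the following sketch (my own, not taken from the paper) draws samples of K correlated Gaussian remote sources and forms the L noisy observations Y^L = AX + N; the source covariance, the matrix A, and the noise variances below are arbitrary assumed values.

# Minimal sketch of the observation model Y = A X + N described above.
# All numeric choices (dimensions, Sigma_X, A, noise variances) are assumptions
# made for illustration only; they do not come from the paper.
import numpy as np

rng = np.random.default_rng(0)

K, L, n = 2, 3, 10_000          # remote sources, observations, sample length

# Covariance of the remote source vector X (any positive-definite matrix works).
Sigma_X = np.array([[1.0, 0.6],
                    [0.6, 1.0]])

# L x K observation matrix A and variances of the independent observation noise N.
A = np.array([[1.0, 0.0],
              [0.5, 0.5],
              [0.0, 1.0]])
noise_var = np.array([0.1, 0.2, 0.1])

X = rng.multivariate_normal(np.zeros(K), Sigma_X, size=n).T        # K x n samples of X
N = rng.normal(0.0, np.sqrt(noise_var)[:, None], size=(L, n))      # L x n independent noise
Y = A @ X + N                                                       # L x n noisy observations

# The empirical covariance of Y should approximately match A Sigma_X A^T + diag(noise_var).
print(np.cov(Y))
print(A @ Sigma_X @ A.T + np.diag(noise_var))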


Many-Help-One Problem for Gaussian Sources with a Tree Structure on their Correlation

In this paper we consider the separate coding problem for L + 1 correlated Gaussian memoryless sources. We deal with the case where the L separately encoded data streams serve as side information at the decoder for the reconstruction of the remaining source. Determining the rate-distortion region for this system is the so-called many-help-one problem and has been known as a highl...
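To make the many-help-one setup concrete, here is a small sketch (my own illustration, not the paper's construction) that generates a primary Gaussian source and L helper sources whose correlation forms the simplest tree, a star; it only shows how the decoder's uncertainty about the primary source shrinks as helper observations are added, and it does not implement any coding scheme or characterize the rate-distortion region.

# Star-shaped correlation tree (an assumed special case): primary source X0 and
# helpers Y_l = X0 + W_l with independent noises W_l. All variances are made up
# for illustration. This only computes the MMSE the decoder could reach with
# unlimited-rate side information, not any achievable rate-distortion point.
import numpy as np

rng = np.random.default_rng(1)

L, n = 4, 100_000
sigma2_x = 1.0                                # variance of the primary source (assumed)
sigma2_w = np.array([0.2, 0.3, 0.5, 1.0])     # helper noise variances (assumed)

X0 = rng.normal(0.0, np.sqrt(sigma2_x), size=n)
Y = X0 + rng.normal(0.0, np.sqrt(sigma2_w)[:, None], size=(L, n))

for l in range(1, L + 1):
    w = 1.0 / sigma2_w[:l]                                # precisions of the first l helpers
    post_var = 1.0 / (1.0 / sigma2_x + w.sum())           # Gaussian posterior variance of X0
    x_hat = post_var * (Y[:l] * w[:, None]).sum(axis=0)   # MMSE estimate of X0 from l helpers
    mse = np.mean((X0 - x_hat) ** 2)
    print(f"helpers={l}: theoretical variance {post_var:.4f}, empirical MSE {mse:.4f}")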


On the Many-Help-One Problem with Independently Degraded Helpers

This work provides new results for the separate encoding of correlated discrete memoryless sources. We address the scenario where an arbitrary number of auxiliary sources (a.k.a. helpers) assist the decoder in reproducing a single primary source without error. The rate-distortion analysis of such a system establishes the so-called many-help-one problem. We focus on the case in which the auxiliary source...


On the Gaussian Listening-Helper Source-Coding Problem

In the Gaussian listening-helper source-coding problem, Encoder 1 and Decoder 1 observe, respectively, a pair of i.i.d. correlated Gaussian sources, and both wish to communicate the first source to Decoder 2 subject to a distortion constraint. Encoder 1 sends a message to both decoders, and then Decoder 1 (the listening helper) sends a message just to Decoder 2. We derive an inner bound on the rat...


Distributed Source Coding for Correlated Memoryless Gaussian Sources

We consider a distributed source coding problem of L correlated Gaussian observations Y_i, i = 1, 2, ..., L. We assume that the random vector Y^L = (Y_1, Y_2, ..., Y_L)^T is an observation of the Gaussian random vector X = (X_1, X_2, ..., X_K)^T, having the form Y^L = AX + N, where A is an L×K matrix and N = (N_1, N_2, ..., N_L)^T is a vector of L independent Gaussian random variables also indepe...



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2010

ISSN: 0018-9448,1557-9654

DOI: 10.1109/tit.2009.2034791